Markov chain Hebbian learning algorithm with ternary synaptic units

Authors

  • Guhyun Kim
  • Vladimir Kornijcuk
  • Dohun Kim
  • Inho Kim
  • Jaewook Kim
  • Hyo Cheon Woo
  • Ji-Hun Kim
  • Cheol Seong Hwang
  • Doo Seok Jeong
Affiliations

  • Center for Electronic Materials, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, 02792 Seoul, Republic of Korea
  • Department of Materials Science and Engineering and Inter-University Semiconductor Research Centre, Seoul National University, 151-744 Seoul, Republic of Korea
  • Department of Nanomaterials, Korea University of Science and Technology, Daejeon, Republic of Korea

Corresponding authors: Cheol Seong Hwang and Doo Seok Jeong

Related articles

Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of ...
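
A minimal sketch of the forward/backward distinction described above, estimating both conditional probabilities from an observed state sequence by counting transitions; the function name and the toy sequence are illustrative, not taken from the cited paper:

```python
from collections import defaultdict

def transition_probabilities(sequence):
    """Estimate forward P(next | current) and backward P(previous | current)
    transition probabilities from a single observed state sequence."""
    forward_counts = defaultdict(lambda: defaultdict(int))
    backward_counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(sequence[:-1], sequence[1:]):
        forward_counts[prev][curr] += 1   # transitions leaving `prev`
        backward_counts[curr][prev] += 1  # transitions arriving at `curr`

    def normalise(counts):
        return {s: {t: c / sum(row.values()) for t, c in row.items()}
                for s, row in counts.items()}

    return normalise(forward_counts), normalise(backward_counts)

# Toy sequence A -> B -> C -> B -> C -> A
fwd, bwd = transition_probabilities(list("ABCBCA"))
print(fwd["B"])  # {'C': 1.0}: from B the sequence always moves to C
print(bwd["C"])  # {'B': 1.0}: C is always entered from B
```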

Dynamic Cell Structures

Dynamic Cell Structures (DCS) represent a family of artificial neural architectures suited both for unsupervised and supervised learning. They belong to the recently introduced [Martinetz94] class of Topology Representing Networks (TRN), which build perfectly topology preserving feature maps. DCS employ a modified Kohonen learning rule in conjunction with competitive Hebbian learning. The Kohon...
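
As a rough illustration of the combination mentioned above (competitive Hebbian edge formation between the two best-matching units plus a Kohonen-style winner update), here is a minimal sketch; it omits the DCS-specific machinery such as resource values, unit insertion, and edge aging, and all names and parameters are illustrative assumptions:

```python
import numpy as np

def train_competitive_hebbian(data, n_units=10, lr=0.05, epochs=20, seed=0):
    """Simplified sketch: for each input, connect the two best-matching units
    (competitive Hebbian edge) and nudge them toward the input (Kohonen-style
    update). DCS-specific parts such as resource values, unit insertion, and
    edge aging are omitted."""
    rng = np.random.default_rng(seed)
    units = rng.uniform(data.min(), data.max(), size=(n_units, data.shape[1]))
    edges = np.zeros((n_units, n_units), dtype=bool)  # lateral connection matrix

    for _ in range(epochs):
        for x in rng.permutation(data):
            dists = np.linalg.norm(units - x, axis=1)
            best, second = np.argsort(dists)[:2]
            edges[best, second] = edges[second, best] = True  # Hebbian edge
            units[best] += lr * (x - units[best])             # winner moves most
            units[second] += 0.5 * lr * (x - units[second])   # runner-up moves less
    return units, edges

# Toy usage: learn a topology-preserving map of points on a ring
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
ring = np.c_[np.cos(theta), np.sin(theta)]
units, edges = train_competitive_hebbian(ring)
print(edges.sum() // 2, "edges between", len(units), "units")
```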

An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity

To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected wit...
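
For intuition about how a local rule can stand in for backpropagation, the sketch below trains a small linear predictive coding network: hidden activity is first relaxed to reduce prediction errors, then each weight is updated using only its presynaptic activity and the postsynaptic prediction error. This is a generic illustration in the spirit of such models, not the cited paper's implementation; every name and hyperparameter is an assumption.

```python
import numpy as np

def pc_step(W0, W1, x0, target, lr=0.01, relax_steps=100, relax_lr=0.05):
    """One predictive-coding step on a linear two-layer network: relax the
    hidden activity to reduce prediction errors, then update each weight with
    the local product of its presynaptic activity and postsynaptic error."""
    x1 = W0 @ x0                      # start hidden units at their feed-forward value
    for _ in range(relax_steps):
        eps1 = x1 - W0 @ x0           # prediction error at the hidden layer
        eps2 = target - W1 @ x1       # prediction error at the clamped output layer
        x1 = x1 + relax_lr * (-eps1 + W1.T @ eps2)   # gradient step on the energy
    eps1 = x1 - W0 @ x0
    eps2 = target - W1 @ x1
    W0 = W0 + lr * np.outer(eps1, x0)  # local: postsynaptic error x presynaptic activity
    W1 = W1 + lr * np.outer(eps2, x1)
    return W0, W1

# Toy usage: learn to reproduce a fixed linear mapping A
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2
W0 = rng.normal(0.0, 0.1, (n_hid, n_in))
W1 = rng.normal(0.0, 0.1, (n_out, n_hid))
A = rng.normal(size=(n_out, n_in))
for _ in range(4000):
    x = rng.normal(size=n_in)
    W0, W1 = pc_step(W0, W1, x, A @ x)
x = rng.normal(size=n_in)
print("test error:", np.linalg.norm(W1 @ (W0 @ x) - A @ x))
```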

Hebbian learning and development.

Hebbian learning is a biologically plausible and ecologically valid learning mechanism. In Hebbian learning, 'units that fire together, wire together'. Such learning may occur at the neural level in terms of long-term potentiation (LTP) and long-term depression (LTD). Many features of Hebbian learning are relevant to developmental theorizing, including its self-organizing nature and its ability...
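
As a toy illustration of the "fire together, wire together" rule and of LTP/LTD-like strengthening and weakening, the following sketch applies a plain Hebbian update with a slow decay term; the function name, rates, and toy data are assumptions for illustration only:

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01, decay=0.001):
    """'Units that fire together, wire together': strengthen weights between
    co-active pre- and post-synaptic units (LTP-like) and apply a slow decay
    so rarely co-active connections weaken (a crude stand-in for LTD)."""
    return w + lr * np.outer(post, pre) - decay * w

# Toy usage: weights between reliably co-active units grow the most
rng = np.random.default_rng(1)
w = np.zeros((3, 3))
for _ in range(500):
    pre = (rng.random(3) < 0.5).astype(float)
    post = pre.copy()                 # units 0 and 1 fire with their inputs
    post[2] = rng.random() < 0.5      # unit 2 fires independently
    w = hebbian_update(w, pre, post)
print(np.round(w, 2))  # w[0, 0] and w[1, 1] end up largest
```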

A Model of Clipped Hebbian Learning in a Neocortical Pyramidal Cell

A detailed compartmental model of a cortical pyramidal cell is used to determine the effect of the spatial distribution of synapses across the dendritic tree on the pattern recognition capability of the neuron. By setting synaptic strengths according to the clipped Hebbian learning rule used in the associative net neural network model, the cell is able to recognise input patterns, but with a o...
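
A small sketch of the clipped Hebbian rule used in binary associative nets (a weight is set to 1 whenever its pre- and post-synaptic units are co-active in a stored pair and is never incremented further), together with a simple thresholded recall; the names, pattern sizes, and sparseness are illustrative assumptions, not the cited model's parameters:

```python
import numpy as np

def store(patterns_in, patterns_out):
    """Clipped Hebbian learning for a binary associative net: a weight becomes 1
    if its pre- and post-synaptic units are ever co-active in a stored pair,
    and is never incremented beyond 1."""
    n_out, n_in = patterns_out.shape[1], patterns_in.shape[1]
    W = np.zeros((n_out, n_in), dtype=int)
    for x, y in zip(patterns_in, patterns_out):
        W |= np.outer(y, x)            # clipping: 1 stays 1
    return W

def recall(W, x):
    """Recall by thresholding each output unit's input sum at the number of
    active units in the cue."""
    return (W @ x >= x.sum()).astype(int)

# Toy usage: store and recall sparse random binary pairs
rng = np.random.default_rng(2)
X = (rng.random((5, 50)) < 0.1).astype(int)
Y = (rng.random((5, 50)) < 0.1).astype(int)
W = store(X, Y)
print(np.array_equal(recall(W, X[0]), Y[0]))  # usually True at this low loading
```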

Journal:
  • CoRR

Volume abs/1711.08679  Issue

Pages  -

Publication date 2017